Statistical mechanics of low-rank tensor decomposition
Often, large, high-dimensional datasets collected across multiple modalities can be organized as a higher-order tensor. Low-rank tensor decomposition then arises as a powerful and widely used tool to discover simple low-dimensional structures underlying such data. However, we currently lack a theoretical understanding of the algorithmic behavior of low-rank tensor decompositions. We derive Bayesian approximate message passing (AMP) algorithms for recovering arbitrarily shaped low-rank tensors buried within noise, and we employ dynamic mean field theory to precisely characterize their performance. Our theory reveals the existence of phase transitions between easy, hard, and impossible inference regimes, and displays an excellent match with simulations. Moreover, it reveals several qualitative surprises compared to the behavior of symmetric, cubic tensor decomposition. Finally, we compare our AMP algorithm to the most commonly used algorithm, alternating least squares (ALS), and demonstrate that AMP significantly outperforms ALS in the presence of noise.
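The observation model in this setting is a low-rank signal tensor buried in noise. A minimal sketch of a rank-one ("spiked") version of this model, with illustrative dimensions and an assumed i.i.d. Gaussian noise level:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative tensor dimensions and noise level (assumptions, not from the paper).
n1, n2, n3 = 20, 30, 40
noise_std = 1.0

# Ground-truth rank-one signal: outer product of three factor vectors.
u = rng.standard_normal(n1)
v = rng.standard_normal(n2)
w = rng.standard_normal(n3)
signal = np.einsum('i,j,k->ijk', u, v, w)

# Observed tensor: the low-rank signal corrupted by additive Gaussian noise.
# Recovery algorithms (AMP, ALS) estimate u, v, w from T alone.
T = signal + noise_std * rng.standard_normal((n1, n2, n3))
```

The inference task is to recover the factors u, v, w from T; as the abstract notes, how hard this is depends sharply on the signal-to-noise ratio, with distinct easy, hard, and impossible regimes.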
Legendre Decomposition for Tensors
Mahito Sugiyama, Hiroyuki Nakahara, Koji Tsuda
CP decomposition compresses an input tensor into a sum of rank-one components, and Tucker decomposition approximates an input tensor by a core tensor multiplied by matrices. To date, matrix and tensor decomposition has been extensively analyzed, and there are a number of variations of such decompositions (Kolda and Bader, 2009), where the common goal is to approximate a given tensor by a smaller number of components, or parameters, in an efficient manner. However, despite recent advances in decomposition techniques, a learning theory that can systematically define decomposition for tensors of any order, including vectors and matrices, is still under development. Moreover, it is well known that CP and Tucker tensor decomposition involve non-convex optimization and that global convergence is not guaranteed.
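The opening sentence can be made concrete: a rank-R CP decomposition represents a third-order tensor as T[i,j,k] ≈ Σ_r A[i,r]·B[j,r]·C[k,r]. A small sketch with assumed illustrative dimensions, verifying that the einsum form equals the explicit sum of rank-one outer products:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative sizes: a 4x5x6 tensor with CP rank R = 3.
R = 3
I, J, K = 4, 5, 6
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Reconstruct the tensor from its CP factor matrices in one contraction.
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalently, a sum of R rank-one components, one outer product per column.
T_sum = sum(np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r]) for r in range(R))
assert np.allclose(T_hat, T_sum)
```

Fitting the factors A, B, C to a given tensor is the non-convex optimization the abstract refers to, which is why global convergence of CP (and Tucker) fitting is not guaranteed.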